coefficient of agreement

French\ \ coefficient d'agrément; coefficient de concordance
German\ \ Übereinstimmungskoeffizient
Dutch\ \ concordantiecoëfficiënt; overeenstemmingscoëfficiënt
Italian\ \ coefficiente di concordanza; coefficiente di accordo
Spanish\ \ coeficiente de concordancia; coeficiente de coincidencia
Catalan\ \ coeficient de concordància
Portuguese\ \ coeficiente de concordância
Romanian\ \ -
Danish\ \ overensstemmelseskoefficient
Norwegian\ \ samsvarskoeffisient
Swedish\ \ överensstämmelsekoefficient
Greek\ \ συντελεστής συμφωνίας
Finnish\ \ parivertailujen yhtäpitävyyskerroin
Hungarian\ \ egyezési együttható
Turkish\ \ uzlaşma katsayısı; uyuşma katsayısı
Estonian\ \ kooskõlakordaja
Lithuanian\ \ susitarimo koeficientas; suderinimo koeficientas
Slovenian\ \ koeficient sporazuma
Polish\ \ współczynnik zgodności
Russian\ \ показатель соответствия; коэффициент согласования
Ukrainian\ \ коефіцієнт узгодженості
Serbian\ \ коефицијент слагања
Icelandic\ \ stuðullinn samkomulag
Euskara\ \ komunztadura-koefiziente
Persian-Farsi\ \ -
Arabic\ \ معامل الاتفاق
Afrikaans\ \ ooreenstemmingskoëffisiënt
Chinese\ \ 一致系数
Korean\ \ 합치도계수

Statistical terms. 2014.


Look at other dictionaries:

  • coefficient of concordance — noun a coefficient of agreement (concordance) between different sets of rank orderings of the same set of things • Topics: ↑statistics • Hypernyms: ↑Kendall test …   Useful english dictionary
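
The coefficient of concordance glossed above is Kendall's W, which measures agreement among m raters ranking the same n items. As a minimal illustration (the function name is my own, and ties are assumed absent):

```python
def kendalls_w(rankings):
    """Kendall's W for m raters each ranking the same n items (no ties).

    rankings: list of m lists, each a permutation of the ranks 1..n.
    Returns a value in [0, 1]; 1 means all raters rank identically.
    """
    m = len(rankings)
    n = len(rankings[0])
    # Sum of ranks each item received across all raters.
    totals = [sum(r[i] for r in rankings) for i in range(n)]
    # Under no agreement, every item's rank total is near this mean.
    mean = m * (n + 1) / 2
    s = sum((t - mean) ** 2 for t in totals)  # squared deviations of rank sums
    return 12 * s / (m * m * (n ** 3 - n))

# Three raters in perfect agreement -> W = 1.0
print(kendalls_w([[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]))
```

Two raters with exactly reversed rankings give W = 0, the no-agreement extreme for m = 2.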

  • Coefficient of determination — In statistics, the coefficient of determination R2 is used in the context of statistical models whose main purpose is the prediction of future outcomes on the basis of other related information. It is the proportion of variability in a data set… …   Wikipedia

  • Kendall tau rank correlation coefficient — The Kendall tau rank correlation coefficient (or simply the Kendall tau coefficient, Kendall's tau, or tau test(s)) is a non-parametric statistic used to measure the degree of correspondence between two rankings and to assess the significance of… …   Wikipedia
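
The tau statistic described above counts concordant versus discordant item pairs. A small sketch of the simplest variant (tau-a, which assumes no ties; the function name is my own):

```python
from itertools import combinations

def kendall_tau(x, y):
    """Kendall's tau-a between two rankings of the same items (no ties).

    Returns (concordant - discordant) / total pairs, in [-1, 1].
    """
    n = len(x)
    concordant = discordant = 0
    for i, j in combinations(range(n), 2):
        # A pair is concordant when both rankings order items i and j the same way.
        s = (x[i] - x[j]) * (y[i] - y[j])
        if s > 0:
            concordant += 1
        elif s < 0:
            discordant += 1
    return (concordant - discordant) / (n * (n - 1) / 2)

# Two adjacent swaps out of 10 pairs -> (8 - 2) / 10 = 0.6
print(kendall_tau([1, 2, 3, 4, 5], [1, 3, 2, 5, 4]))
```

Completely reversed rankings yield -1, identical rankings +1.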

  • Concordance correlation coefficient — In statistics, the concordance correlation coefficient measures the agreement between two variables, e.g., to evaluate reproducibility or for inter-rater reliability. Definition: Lawrence Lin gives the form of the concordance correlation coefficient …   Wikipedia

  • Phi coefficient — In statistics, the phi coefficient (also referred to as the mean square contingency coefficient and denoted by φ or rφ) is a measure of association for two binary variables introduced by Karl Pearson[1]. This measure is similar to the Pearson… …   Wikipedia

  • tau coefficient of correlation — noun a nonparametric measure of the agreement between two rankings • Syn: ↑Kendall's tau, ↑Kendall rank correlation • Topics: ↑statistics • Hypernyms: ↑Kendall test …   Useful english dictionary

  • Inter-rater reliability — In statistics, inter-rater reliability, inter-rater agreement, or concordance is the degree of agreement among raters. It gives a score of how much homogeneity, or consensus, there is in the ratings given by judges. It is useful in refining the… …   Wikipedia

  • Cohen's kappa — coefficient is a statistical measure of inter-rater agreement or inter-annotator agreement[1] for qualitative (categorical) items. It is generally thought to be a more robust measure than a simple percent-agreement calculation, since κ takes into… …   Wikipedia
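
The entry above describes κ as correcting observed agreement for chance. A minimal sketch of that calculation for two raters (function name and example labels are my own):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters labelling the same items.

    kappa = (p0 - pe) / (1 - pe), where p0 is observed agreement and
    pe is the agreement expected by chance from each rater's label frequencies.
    """
    n = len(rater_a)
    # Observed proportion of items on which the raters agree.
    p0 = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from the marginal frequency of each label.
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    pe = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p0 - pe) / (1 - pe)

a = ["yes", "yes", "no", "yes", "no", "no"]
b = ["yes", "no", "no", "yes", "no", "yes"]
print(round(cohens_kappa(a, b), 3))  # 4/6 observed vs 0.5 by chance -> 0.333
```

With 4 of 6 agreements but 0.5 expected by chance, κ ≈ 0.33, well below the raw 67% percent agreement, which is the robustness the gloss refers to.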

  • Cohens Kappa — Cohen's kappa is a statistical measure of the inter-rater reliability of assessments made by (usually) two raters, proposed by Jacob Cohen in 1960. The measure can also be used for intra-rater reliability, in which the same …   Deutsch Wikipedia

  • Fleiss' Kappa — Cohen's kappa is a statistical measure of the inter-rater reliability of assessments made by (usually) two raters, proposed by Jacob Cohen in 1960. The equation for Cohen's kappa is … where p0 is the measured… …   Deutsch Wikipedia
